Crisp: Cognitive Restructuring of Negative Thoughts through Multi-turn Supportive Dialogues
Zhou, Jinfeng, Chen, Yuxuan, Yin, Jianing, Huang, Yongkang, Shi, Yihan, Zhang, Xikun, Peng, Libiao, Zhang, Rongsheng, Lv, Tangjie, Hu, Zhipeng, Wang, Hongning, Huang, Minlie
Cognitive Restructuring (CR) is a psychotherapeutic process aimed at identifying and restructuring an individual's negative thoughts, arising from mental health challenges, into more helpful and positive ones via multi-turn dialogues. Clinician shortage and stigma urge the development of human-LLM interactive psychotherapy for CR. Yet, existing efforts implement CR via simple text rewriting, fixed-pattern dialogues, or a one-shot CR workflow, failing to align with the psychotherapeutic process for effective CR. To address this gap, we propose CRDial, a novel framework for CR, which creates multi-turn dialogues with specifically designed identification and restructuring stages of negative thoughts, integrates sentence-level supportive conversation strategies, and adopts a multi-channel loop mechanism to enable iterative CR. With CRDial, we distill Crisp, a large-scale and high-quality bilingual dialogue dataset, from an LLM. We then train Crispers, Crisp-based conversational LLMs for CR, at 7B and 14B scales. Extensive human studies show the superiority of Crispers in pointwise, pairwise, and intervention evaluations.
History-Aware Hierarchical Transformer for Multi-session Open-domain Dialogue System
Zhang, Tong, Liu, Yong, Li, Boyang, Zeng, Zhiwei, Wang, Pengwei, You, Yuan, Miao, Chunyan, Cui, Lizhen
With the evolution of pre-trained language models, current open-domain dialogue systems have achieved great progress in conducting one-session conversations. In contrast, Multi-Session Conversation (MSC), which consists of multiple sessions over a long term with the same user, is under-investigated. In this paper, we propose History-Aware Hierarchical Transformer (HAHT) for multi-session open-domain dialogue. HAHT maintains a long-term memory of history conversations and utilizes history information to understand current conversation context and generate well-informed and context-relevant responses. Specifically, HAHT first encodes history conversation sessions hierarchically into a history memory. Then, HAHT leverages historical information to facilitate the understanding of the current conversation context by encoding the history memory together with the current context with attention-based mechanisms. Finally, to explicitly utilize historical information, HAHT uses a history-aware response generator that switches between a generic vocabulary and a history-aware vocabulary. Experimental results on a large-scale MSC dataset suggest that the proposed HAHT model consistently outperforms baseline models. Human evaluation results support that HAHT generates more human-like, context-relevant and history-relevant responses than baseline models.